Regularization With Non-convex Separable Constraints

Authors

  • Kristian Bredies
  • Dirk A. Lorenz
Abstract

We consider regularization of nonlinear ill-posed problems with constraints which are non-convex. As a special case we consider separable constraints, i.e., the regularization takes place in a sequence space and the constraint acts on each sequence element with a possibly non-convex function. We derive conditions under which such a constraint provides a regularization. Moreover, we derive estimates for the error and obtain convergence rates for vanishing noise level. Our assumptions especially cover the example of regularization with a sparsity constraint in which the p-th power with 0 < p ≤ 1 is used, and we present other examples as well. In particular, we derive estimates for the error measured in the quasi-norms and obtain a convergence rate of O(δ) for the error measured in ‖·‖p.

AMS Classification: 65J20, 46A16
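For orientation, the setting in the abstract can be sketched as a Tikhonov-type minimization with a separable, possibly non-convex penalty. The notation below is an illustrative assumption (F for the nonlinear forward operator, g^δ for the noisy data at noise level δ, α for the regularization parameter, u_k for the sequence elements) and is not taken verbatim from the paper:

    % Tikhonov-type functional with a penalty acting separately on each coefficient
    \[
      u_\alpha^\delta \in \operatorname*{arg\,min}_{u}\;
        \|F(u) - g^\delta\|^2 + \alpha \sum_{k} \phi(u_k),
      \qquad \phi(t) = |t|^p, \quad 0 < p \le 1.
    \]

In this sparsity example the separable penalty induces the quasi-norm ‖·‖p, and the rate stated in the abstract says that, for a suitable parameter choice α = α(δ), the error measured in ‖·‖p behaves like O(δ) as the noise level δ vanishes.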


Similar resources

Partially separable convexly-constrained optimization with non-Lipschitzian singularities and its complexity

An adaptive regularization algorithm using high-order models is proposed for partially separable convexly constrained nonlinear optimization problems whose objective function contains non-Lipschitzian ℓq-norm regularization terms for q ∈ (0, 1). It is shown that the algorithm using a p-th order Taylor model for p odd needs in general at most O(ε^(−(p+1)/p)) evaluations of the objective function a...


Improving an ADMM-like Splitting Method via Positive-Indefinite Proximal Regularization for Three-Block Separable Convex Minimization

The augmented Lagrangian method (ALM) is fundamental for solving convex minimization models with linear constraints. When the objective function is separable such that it can be represented as the sum of more than one function without coupled variables, various splitting versions of the ALM have been well studied in the literature, such as the alternating direction method of multiplier...


Distributed Majorization-Minimization for Laplacian Regularized Problems

We consider the problem of minimizing a block separable convex function (possibly nondifferentiable, and including constraints) plus Laplacian regularization, a problem that arises in applications including model fitting, regularizing stratified models, and multi-period portfolio optimization. We develop a distributed majorization-minimization method for this general problem, and derive a comple...


A Flexible ADMM Algorithm for Big Data Applications

We present a flexible Alternating Direction Method of Multipliers (F-ADMM) algorithm for solving optimization problems involving a strongly convex objective function that is separable into n ≥ 2 blocks, subject to (non-separable) linear equality constraints. The F-ADMM algorithm uses a Gauss-Seidel scheme to update blocks of variables, and a regularization term is added to each of the subproble...


Large-scale randomized-coordinate descent methods with non-separable linear constraints

We develop randomized block coordinate descent (CD) methods for linearly constrained convex optimization. Unlike other large-scale CD methods, we do not assume the constraints to be separable, but allow them to be coupled linearly. To our knowledge, ours is the first CD method that allows linear coupling constraints, without making the global iteration complexity have an exponential dependence on ...




Publication year: 2009